QBoost: Large Scale Classifier Training with Adiabatic Quantum Optimization

Authors

  • Hartmut Neven
  • Vasil S. Denchev
  • Geordie Rose
  • William G. Macready
Abstract

We introduce a novel discrete optimization method for training in the context of a boosting framework for large scale binary classifiers. The motivation is to cast the training problem into the format required by existing adiabatic quantum hardware. First we provide theoretical arguments concerning the transformation of an originally continuous optimization problem into one with discrete variables of low bit depth. Next we propose QBoost as an iterative training algorithm in which a subset of weak classifiers is selected by solving a hard optimization problem in each iteration. A strong classifier is incrementally constructed by concatenating the subsets of weak classifiers. We supplement the findings with experiments on one synthetic and two natural data sets and compare against the performance of existing boosting algorithms. Finally, by conducting a quantum Monte Carlo simulation we gather evidence that adiabatic quantum optimization is able to handle the discrete optimization problems generated by QBoost.
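
As a concrete illustration of the discrete training problem described in the abstract, the following is a minimal sketch of a QUBO construction for selecting weak classifiers with single-bit weights. The quadratic loss on the averaged weak-classifier vote, the penalty weight lam, and the function name qboost_qubo are illustrative assumptions rather than details taken from the paper; the brute-force loop at the end merely stands in for the optimizer (adiabatic quantum hardware or a classical heuristic).

```python
import numpy as np


def qboost_qubo(H, y, lam=0.1):
    """Build a QUBO matrix for selecting weak classifiers with 1-bit weights.

    H   : (S, N) array, H[s, i] = output of weak classifier i on sample s (+/-1)
    y   : (S,) array of training labels (+/-1)
    lam : penalty on the number of selected weak classifiers

    Minimizes  sum_s ((1/N) * sum_i w_i H[s, i] - y[s])**2 + lam * sum_i w_i
    over binary weights w_i in {0, 1}.
    """
    S, N = H.shape
    # Quadratic couplings: pairwise correlations of the weak classifiers,
    # scaled by 1/N^2 from the averaged vote.
    Q = (H.T @ H) / N ** 2
    # Linear terms go on the diagonal (valid because w_i**2 == w_i for bits):
    # self-correlation, minus correlation with the labels, plus the penalty.
    linear = np.diag(Q) - (2.0 / N) * (H.T @ y) + lam
    np.fill_diagonal(Q, linear)
    return Q  # energy of a bit vector w is w @ Q @ w (up to a constant)


if __name__ == "__main__":
    # Tiny demo: brute force stands in for the quantum or classical optimizer.
    rng = np.random.default_rng(0)
    H = rng.choice([-1, 1], size=(20, 6))
    y = rng.choice([-1, 1], size=20)
    Q = qboost_qubo(H, y)
    best_w, best_e = None, np.inf
    for b in range(2 ** 6):
        w = np.array([(b >> i) & 1 for i in range(6)])
        e = w @ Q @ w
        if e < best_e:
            best_w, best_e = w, e
    print("selected weak classifiers:", best_w)
```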

Related articles

Training a Large Scale Classifier with the Quantum Adiabatic Algorithm

In a previous publication we proposed discrete global optimization as a method to train a strong binary classifier constructed as a thresholded sum over weak classifiers. Our motivation was to cast the training of a classifier into a format amenable to solution by the quantum adiabatic algorithm. Applying adiabatic quantum computing (AQC) promises to yield solutions that are superior to those w...

Training a Binary Classifier with the Quantum Adiabatic Algorithm

This paper describes how to make the problem of binary classification amenable to quantum computing. A formulation is employed in which the binary classifier is constructed as a thresholded linear superposition of a set of weak classifiers. The weights in the superposition are optimized in a learning process that strives to minimize the training error as well as the number of weak classifiers u...

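The formulation sketched in this snippet, a strong classifier built as a thresholded linear superposition of weak classifiers and trained to trade off training error against the number of classifiers used, can be written compactly as below. The quadratic surrogate for the training error is an assumption on our part (the snippet does not name the loss), and the symbols h_i, w_i, and λ are illustrative.

```latex
% Strong classifier as a thresholded linear superposition of weak classifiers h_i,
% trained over binary weights (quadratic surrogate loss is assumed):
\[
  y(\mathbf{x}) = \operatorname{sign}\!\Big(\sum_{i=1}^{N} w_i\, h_i(\mathbf{x})\Big),
  \qquad
  \mathbf{w}^{*} = \operatorname*{arg\,min}_{\mathbf{w}\in\{0,1\}^{N}}
  \sum_{s=1}^{S}\Big(\frac{1}{N}\sum_{i=1}^{N} w_i\, h_i(\mathbf{x}_s) - y_s\Big)^{\!2}
  + \lambda \sum_{i=1}^{N} w_i .
\]
```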

Developing Quantum Annealer Driven Data Discovery

Machine learning applications are limited by computational power. In this paper, we gain novel insights into the application of quantum annealing (QA) to machine learning (ML) through experiments in natural language processing (NLP), seizure prediction, and linear separability testing. These experiments are performed on QA simulators and early-stage commercial QA hardware and compared to an unp...

Investigating the Performance of an Adiabatic Quantum Optimization Processor

We calculate median adiabatic times (in seconds) of a specific superconducting adiabatic quantum processor for an NP-hard Ising spin glass instance class with up to N = 128 binary variables. To do so, we ran high performance Quantum Monte Carlo simulations on a large-scale Internet-based computing platform. We compare the median adiabatic times with the median running times of two classical sol...

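For orientation, the benchmark problems mentioned here are Ising spin glass instances, the native input format of the hardware. The sketch below shows the standard Ising energy and the usual change of variables relating it to a QUBO over bits, such as the classifier-selection problem above; the specific couplings and fields of the benchmark instance class are not given in the snippet.

```latex
% Ising spin glass energy over spins s_i in {-1,+1}:
\[
  E(\mathbf{s}) = \sum_{i<j} J_{ij}\, s_i s_j + \sum_{i} h_i\, s_i ,
  \qquad s_i \in \{-1,+1\}.
\]
% A QUBO over bits w_i in {0,1} maps onto this form via w_i = (1 + s_i)/2,
% which only rescales the couplings, shifts the linear terms, and adds a
% constant offset.
```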

Adiabatic quantum optimization fails for random instances of NP-complete problems

Abstract Adiabatic quantum optimization has attracted a lot of attention because small scale simulations gave hope that it would allow to solve NP-complete problems efficiently. Later, negative results proved the existence of specifically designed hard instances where adiabatic optimization requires exponential time. In spite of this, there was still hope that this would not happen for random i...

Publication date: 2012